Mansour Baziar
Abstract
Background and Purpose: Nitrates have long been considered indicative of drinking water quality and a critical concern for human health. The evolution of advanced models for water quality management has spurred decision-makers to incorporate artificial intelligence technologies into water quality planning. This study aims to employ the AdaBoost model, one of the cutting-edge models in water quality management, to predict nitrate concentrations in groundwater using pH and EC (electrical conductivity) as input variables.

Materials and Methods: Initially, the study analyzed the Pearson correlation matrix and subsequently determined the input variables for multiple AdaBoost models with varying hyperparameters. A sensitivity and dependence analysis of the model's input variables was conducted to assess their impact on nitrate prediction.

Results: The AdaBoost model achieved R-squared (R2) values of 0.915 for the training dataset and 0.924 for the test dataset. The Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and Mean Absolute Percentage Error (MAPE) for the training dataset were 1.02, 1.01, 0.823, and 7.3%, respectively; for the test dataset, the corresponding values were 0.228, 0.477, 0.375, and 3.2%. The sensitivity analysis identified pH as the most influential variable in nitrate prediction.

Conclusion: The model analysis demonstrates that the proposed method performs well in predicting nitrate concentrations. This approach holds significant potential for implementation as an intelligent system for forecasting water quality parameters.
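The workflow the abstract describes (pH and EC as predictors, an AdaBoost regressor, evaluation with R2/MSE/RMSE/MAE) can be sketched with scikit-learn. This is a minimal illustration, not the study's code: the value ranges and the pH/EC–nitrate relationship below are synthetic and purely hypothetical, and the hyperparameters are arbitrary placeholders.

```python
# Minimal sketch of an AdaBoost nitrate-prediction pipeline (synthetic data).
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
pH = rng.uniform(6.5, 8.5, n)            # hypothetical groundwater pH range
EC = rng.uniform(200.0, 1500.0, n)       # hypothetical EC range (uS/cm)
# Hypothetical relationship: nitrate driven by pH and EC plus noise.
nitrate = 30.0 - 2.5 * pH + 0.004 * EC + rng.normal(0.0, 0.5, n)

X = np.column_stack([pH, EC])
X_tr, X_te, y_tr, y_te = train_test_split(X, nitrate, test_size=0.2, random_state=0)

model = AdaBoostRegressor(n_estimators=200, learning_rate=0.5, random_state=0)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
r2 = r2_score(y_te, pred)
mse = mean_squared_error(y_te, pred)
rmse = mse ** 0.5
mae = mean_absolute_error(y_te, pred)
print(f"R2={r2:.3f}  MSE={mse:.3f}  RMSE={rmse:.3f}  MAE={mae:.3f}")
```

On real data, the hyperparameter search and the pH/EC sensitivity analysis mentioned in the abstract would replace the fixed settings and synthetic relationship used here.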
Mohsen Niazi; Ali Naghizadeh; Mansour Baziar
Abstract
Background and purpose: The turbidity of treated water is measured as an important parameter in determining the quality of drinking or industrial water in all treatment plants. Because of pathogens such as Giardia and Cryptosporidium, which cause dangerous diseases such as dysentery, studies have established a relationship between reducing turbidity and increasing the removal of these microorganisms.

Materials and methods: In this study, an artificial neural network (ANN) model and a multiple linear regression (MLR) model were developed, and their performance in predicting the turbidity of treated water from the Tabas water treatment plant was compared. Total dissolved solids, pH, temperature, and the inlet turbidity of the raw water were used as input parameters of the models. The best backpropagation algorithm and number of neurons were determined to optimize the model architecture.

Results: The Levenberg–Marquardt algorithm was selected as the best training algorithm, and the optimal number of neurons was determined to be 16. The sensitivity analysis of the neural network model showed that inlet turbidity, with a contribution of 29%, is the most important parameter in the development of the ANN model.

Conclusion: The correlation coefficients of the MLR and ANN models were 0.63 and 0.8921 for the training data and 0.60 and 0.8571 for the test data, respectively, demonstrating the superiority of the ANN model in predicting the output turbidity of the Tabas water treatment plant.
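The ANN-versus-MLR comparison described above can be sketched with scikit-learn. This is a hypothetical illustration, not the study's code: the data are synthetic, the inlet-turbidity–outlet-turbidity relationship below is invented, and scikit-learn's `MLPRegressor` does not offer the Levenberg–Marquardt algorithm, so L-BFGS is substituted here; a 16-neuron hidden layer mirrors the abstract's optimal architecture.

```python
# Sketch comparing a 16-neuron ANN with multiple linear regression (synthetic data).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 600
tds = rng.uniform(100.0, 600.0, n)       # total dissolved solids (mg/L), hypothetical
pH = rng.uniform(6.8, 8.2, n)
temp = rng.uniform(5.0, 30.0, n)         # water temperature (deg C)
turb_in = rng.uniform(1.0, 80.0, n)      # raw-water inlet turbidity (NTU)
# Invented nonlinear relationship: outlet turbidity dominated by inlet turbidity.
turb_out = 0.02 * turb_in + 0.0015 * turb_in**2 + 0.002 * temp \
    + rng.normal(0.0, 0.05, n)

X = np.column_stack([tds, pH, temp, turb_in])
X_tr, X_te, y_tr, y_te = train_test_split(X, turb_out, test_size=0.2, random_state=1)

mlr = LinearRegression().fit(X_tr, y_tr)
ann = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16,), solver="lbfgs",
                 max_iter=5000, random_state=1),
).fit(X_tr, y_tr)

# Test-set correlation coefficients, the metric reported in the abstract.
r_mlr = np.corrcoef(y_te, mlr.predict(X_te))[0, 1]
r_ann = np.corrcoef(y_te, ann.predict(X_te))[0, 1]
print(f"MLR r={r_mlr:.3f}  ANN r={r_ann:.3f}")
```

Because the invented relationship is nonlinear in inlet turbidity, the ANN tracks it more closely than the linear model, echoing the abstract's conclusion, though the actual margin depends entirely on the data.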